Nearest neighbor pattern classification
Authors
Abstract
The case of n unity-variance random variables x_1, x_2, ..., x_n governed by the joint probability density w(x_1, x_2, ..., x_n) is considered, where the density depends on the (normalized) cross-covariances ρ_ij = E[(x_i − x̄_i)(x_j − x̄_j)]. It is shown that the condition (*) holds for an "arbitrary" function f(x_1, x_2, ..., x_n) of n variables if and only if the underlying density w(x_1, x_2, ..., x_n) is the usual n-dimensional Gaussian density for correlated random variables. This result establishes a generalized form of Price's theorem in which: 1) the relevant condition (*) subsumes Price's original condition; 2) the proof is accomplished without appeal to Laplace integral expansions; and 3) conditions referring to derivatives with respect to diagonal terms ρ_ii are avoided, so that the unity-variance assumption can be retained.

Manuscript received February 10, 1966; revised May 2, 1966. The author is with the Ordnance Research Laboratory, Pennsylvania State University, State College, Pa.

PRICE'S THEOREM and its various extensions [1]-[4] have had great utility in the determination of output correlations between zero-memory nonlinearities subjected to jointly Gaussian inputs. In its original form, the theorem considered n jointly normal random variables x_1, x_2, ..., x_n with respective means x̄_1, x̄_2, ..., x̄_n and nth-order joint probability density

P(x_1, x_2, \ldots, x_n) = \frac{1}{(2\pi)^{n/2} |M_n|^{1/2}} \exp\left\{ -\frac{1}{2|M_n|} \sum_{r=1}^{n} \sum_{s=1}^{n} \bar{M}_{rs}\,(x_r - \bar{x}_r)(x_s - \bar{x}_s) \right\},    (1)

where |M_n| is the determinant of M_n = [ρ_rs], ρ_rs = E[(x_r − x̄_r)(x_s − x̄_s)]/(σ_r σ_s) is the correlation coefficient of x_r and x_s, and M̄_rs is the cofactor of ρ_rs in M_n. From [1], the theorem statement is as follows: "Let there be n zero-memory nonlinear devices specified by the input-output relationships f_i(x), i = 1, 2, ..., n. Let each x_i be the single input to a corresponding f_i(x) ..."
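For concreteness, below is a minimal Monte Carlo check of Price's original condition in the bivariate case, assuming zero means, unit variances, and the separable nonlinearity f(x_1, x_2) = g(x_1)g(x_2) with g = tanh; the choice of g, the correlation value, and the sample size are illustrative assumptions, not taken from the paper. In this case the condition reads d/dρ E[g(x_1)g(x_2)] = E[g'(x_1)g'(x_2)].

import numpy as np

def mc_mean(rho, fn, n=1_000_000, seed=0):
    """Monte Carlo estimate of E[fn(x1) * fn(x2)] for zero-mean, unit-variance
    jointly Gaussian x1, x2 with correlation rho. A fixed seed gives common
    random numbers across calls, so finite differences in rho are well behaved."""
    rng = np.random.default_rng(seed)
    cov = np.array([[1.0, rho], [rho, 1.0]])
    x = rng.multivariate_normal(np.zeros(2), cov, size=n)
    return float(np.mean(fn(x[:, 0]) * fn(x[:, 1])))

g = np.tanh                            # illustrative zero-memory nonlinearity
dg = lambda t: 1.0 / np.cosh(t) ** 2   # g'(t) = sech^2(t)

rho, eps = 0.3, 0.05
lhs = (mc_mean(rho + eps, g) - mc_mean(rho - eps, g)) / (2 * eps)  # d/d(rho) E[g g]
rhs = mc_mean(rho, dg)                                             # E[g' g']
print(f"finite-difference LHS ~ {lhs:.4f}, Price RHS ~ {rhs:.4f}")

Because both finite-difference evaluations share the same underlying normal draws, the two estimates agree closely even at moderate sample sizes.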
Similar resources
An Improved K-Nearest Neighbor with Crow Search Algorithm for Feature Selection in Text Documents Classification
The Internet provides easy access to all kinds of library resources. However, classifying documents within a large amount of data is still an issue, and finding particular documents demands time and energy. Classifying similar documents into specific classes can reduce the time needed to search for the required data, particularly for text documents. This is further facilitated by using Artificial...
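As a point of reference, here is a minimal TF-IDF plus k-NN text classifier, assuming scikit-learn is available; the crow search feature-selection step that the paper adds is deliberately omitted, and the documents, labels, and query are made-up toy data.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier

docs = ["stock markets fell sharply",
        "the team won the championship game",
        "central bank raises interest rates",
        "the player scored the winning goal"]
labels = ["finance", "sports", "finance", "sports"]

vec = TfidfVectorizer()
X = vec.fit_transform(docs)   # sparse TF-IDF document vectors
clf = KNeighborsClassifier(n_neighbors=1, metric="cosine").fit(X, labels)

print(clf.predict(vec.transform(["bank cuts interest rates"])))  # -> ['finance']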
Evaluation Accuracy of Nearest Neighbor Sampling Method in Zagross Forests
Collecting appropriate qualitative and quantitative data is necessary for proper management and planning. Using suitable inventory methods is essential, and the accuracy of sampling methods depends on the inventory network and the number of sample points. The nearest neighbor sampling method is one of the distance methods and is calculated by three equations (Byth and Ripley, 1980; Cottam and Curtis, 1956; and Cota...
An Improved k-Nearest Neighbor Classification Algorithm Using Shared Nearest Neighbor Similarity
k-Nearest Neighbor (KNN) is one of the most popular algorithms for pattern recognition. Many researchers have found that the KNN classifier may decrease classification precision because of the uneven density of training samples. In view of this defect, an improved k-nearest neighbor algorithm is presented using shared nearest neighbor similarity, which can compute the similarity between test ...
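A minimal sketch of shared-nearest-neighbor (SNN) similarity follows, assuming the common definition sim(a, b) = |kNN(a) ∩ kNN(b)|; the paper's exact weighting may differ, and the data below are toy assumptions.

import numpy as np

def knn_indices(X, q, k):
    """Indices of the k nearest rows of X to query q (Euclidean distance)."""
    d = np.linalg.norm(X - q, axis=1)
    return set(np.argsort(d)[:k])

def snn_similarity(X, a, b, k=5):
    """Shared-nearest-neighbor similarity: size of the overlap between the
    k-neighborhoods of points a and b, computed over the reference set X."""
    return len(knn_indices(X, a, k) & knn_indices(X, b, k))

# Toy usage: two well-separated clusters.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(4, 0.5, (20, 2))])
print(snn_similarity(X, X[0], X[1]))   # same cluster: larger overlap expected
print(snn_similarity(X, X[0], X[25]))  # different clusters: overlap near zero

The appeal of SNN is that the overlap count adapts to local density, which is exactly the failure mode of plain KNN that the abstract describes.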
Improving nearest neighbor classification with cam weighted distance
Nearest neighbor (NN) classification assumes locally constant class-conditional probabilities and suffers from bias in high dimensions with a small sample set. In this paper, we propose a novel cam weighted distance to ameliorate the curse of dimensionality. Unlike the existing neighborhood-based methods, which only analyze a small space emanating from the query sample, the proposed nea...
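The cam weighted distance itself is defined in the paper; as a generic stand-in that illustrates the same idea of reshaping distances before the neighbor vote, here is a standard inverse-distance-weighted k-NN classifier (explicitly not the paper's method; data and parameters are toy assumptions).

import numpy as np
from collections import Counter

def weighted_knn_predict(X, y, q, k=5, eps=1e-9):
    """Inverse-distance-weighted k-NN vote: closer neighbors get a larger say."""
    d = np.linalg.norm(X - q, axis=1)
    votes = Counter()
    for i in np.argsort(d)[:k]:
        votes[y[i]] += 1.0 / (d[i] + eps)   # weight ~ 1/distance
    return votes.most_common(1)[0][0]

# Toy usage on two Gaussian blobs.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(5, 1, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
print(weighted_knn_predict(X, y, np.array([4.5, 4.8])))  # expect class 1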
Journal: IEEE Trans. Information Theory
Volume: 13, Issue: -
Pages: -
Publication date: 1967